Hidden Node Pruning of Multilayer Perceptrons Based on Redundancy Reduction
Author
Abstract
Among the many approaches to choosing the proper size of a neural network, one popular strategy is to start with an oversized network and then prune it to a smaller size, so as to attain better performance with less computational complexity. In this paper, a new hidden-node pruning method is proposed, based on reducing the redundancy among hidden nodes. The redundancy information is given by the correlation coefficients among hidden-node outputs, which keeps the computational cost of the pruning procedure low. Experimental results demonstrate the effectiveness of the proposed method.
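As an illustration of the general idea, the following sketch flags hidden nodes whose outputs are highly correlated with another node as candidates for removal. This is not the authors' exact algorithm: the activation matrix, the threshold, and the keep-the-first-node rule are assumptions made for the example.

```python
# Minimal sketch of correlation-based redundancy pruning (illustrative only;
# the paper's exact criterion and any weight-compensation step may differ).
import numpy as np

def redundant_hidden_nodes(hidden_acts, threshold=0.95):
    """hidden_acts: (n_samples, n_hidden) activations of a trained MLP on a
    representative data set. Returns indices of hidden nodes whose outputs are
    highly correlated with an earlier node and are therefore pruning candidates."""
    # Correlation coefficients among hidden-node outputs.
    corr = np.corrcoef(hidden_acts, rowvar=False)
    n = corr.shape[0]
    to_prune = set()
    for i in range(n):
        if i in to_prune:
            continue
        for j in range(i + 1, n):
            # If two nodes carry nearly the same information, keep only one.
            if abs(corr[i, j]) > threshold:
                to_prune.add(j)
    return sorted(to_prune)

# Example: 1000 samples, 8 hidden nodes, the last one nearly duplicating the first.
rng = np.random.default_rng(0)
acts = rng.standard_normal((1000, 8))
acts[:, 7] = acts[:, 0] + 0.01 * rng.standard_normal(1000)  # redundant copy
print(redundant_hidden_nodes(acts))  # -> [7]
```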
Similar Resources
4. Multilayer perceptrons and back-propagation
Multilayer feed-forward networks, or multilayer perceptrons (MLPs), have one or several "hidden" layers of nodes. This implies that they have two or more layers of weights. The limitations of simple perceptrons do not apply to MLPs. In fact, as we will see later, a network with just one hidden layer can represent any Boolean function (including the XOR which is, as we saw, not linearly separab...
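As a small illustration of that claim, the hand-wired 2-2-1 network below with threshold units computes XOR, which no single-layer perceptron can. The weights and thresholds are chosen here for the example and are not taken from the text.

```python
# Illustrative only: a 2-2-1 MLP with step activations computing XOR.
import numpy as np

step = lambda z: (z > 0).astype(int)

def xor_mlp(x1, x2):
    x = np.array([x1, x2])
    # Hidden layer: node 0 fires for "x1 OR x2", node 1 fires for "x1 AND x2".
    h = step(np.array([[1, 1], [1, 1]]) @ x - np.array([0.5, 1.5]))
    # Output: OR minus AND gives XOR.
    return int(step(np.array([1, -1]) @ h - 0.5))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_mlp(a, b))  # prints 0, 1, 1, 0 for the four inputs
```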
Effect of nonlinear transformations on correlation between weighted sums in multilayer perceptrons
Nonlinear transformation is one of the major obstacles to analyzing the properties of multilayer perceptrons. In this letter, we prove that the correlation coefficient between two jointly Gaussian random variables decreases when each of them is transformed under continuous nonlinear transformations, which can be approximated by piecewise linear functions. When the inputs or the weights of a mul...
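A quick numerical check of that statement (an illustration, not the letter's proof; the tanh nonlinearity and the correlation value 0.8 are chosen here only for the demo):

```python
# Jointly Gaussian inputs: the correlation magnitude shrinks after a
# nonlinear squashing function such as tanh is applied to each variable.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

print(np.corrcoef(x, y)[0, 1])                    # ~0.80 before the nonlinearity
print(np.corrcoef(np.tanh(x), np.tanh(y))[0, 1])  # roughly 0.78, strictly smaller
```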
Adaptive High Order Neural Trees for Pattern Recognition
In this paper, a new classifier, called adaptive high order neural tree (AHNT), is proposed for pattern recognition applications. It is a hierarchical multi-level neural network, in which the nodes are organized into a tree topology. It successively partitions the training set into subsets, assigning each subset to a different child node. Each node can be a first-order or a high order perceptro...
Active Learning in Multilayer Perceptrons
We propose an active learning method with hidden-unit reduction, which is devised specially for multilayer perceptrons (MLP). First, we review our active learning method, and point out that many Fisher-information-based methods applied to MLP have a critical problem: the information matrix may be singular. To solve this problem, we derive the singularity condition of an information matrix, and ...
Improve an Efficiency of Feedforward Multilayer Perceptrons by Serial Training
The feedforward multilayer perceptron is a widely used artificial neural network model, trained with the backpropagation algorithm on real-world data. There are two common ways to construct a feedforward multilayer perceptron network: either take a large network and then prune away the irrelevant nodes, or start from a small network and then add new relevant nodes. An Arti...